Conversation

@fw7th (Contributor) commented Dec 24, 2025

What this does

  • Replaces the einsum implementation of the bilinear transformation (aten_bilinear) with matmul

Why

  • MatMul is faster than Einsum here, giving a performance increase

Testing

  • Existing tests for the function implementation pass

Resolves #2573
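
For context, the einsum-to-matmul rewrite can be sketched in plain NumPy. This is an illustrative equivalent only, not the actual ONNX Script implementation (which, per the review below, emits a sequence of transpose, reshape, matmul, and squeeze operations on ONNX nodes); the function name `bilinear_via_matmul` is made up for the sketch:

```python
import numpy as np

def bilinear_via_matmul(x1, x2, weight, bias=None):
    """Bilinear transform out[b, o] = x1[b] @ weight[o] @ x2[b] (+ bias[o]),
    expressed with matmul instead of einsum("bi,oij,bj->bo", ...).
    Hypothetical NumPy sketch, not the ONNX Script op itself."""
    # weight: (O, I1, I2); the 2-D x1: (B, I1) broadcasts against the O-stack.
    z = np.matmul(x1, weight)                          # (O, B, I2)
    # Contract the I2 axis with x2 as a batched matrix-vector product.
    out = np.matmul(z[:, :, None, :], x2[:, :, None])  # (O, B, 1, 1)
    out = out.squeeze(-1).squeeze(-1).T                # (B, O)
    if bias is not None:
        out = out + bias
    return out
```

The key step is that `np.matmul` broadcasts the 2-D `x1` against the stack of `O` weight matrices, so the whole contraction becomes two batched matmuls plus squeezes.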

@fw7th (Contributor, Author) commented Dec 24, 2025

@microsoft-github-policy-service agree

@fw7th (Contributor, Author) commented Dec 24, 2025

Note on test failure

The failure appears unrelated to the bilinear changes. The error is raised during test input generation for upsample_bilinear2d_* in full-graph mode
(ValueError: Seed must be between 0 and 2**32 - 1).

This suggests an RNG seed issue in the test harness rather than an op
implementation problem. I can reproduce it consistently locally.

codecov bot commented Dec 24, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 70.11%. Comparing base (72321e5) to head (292b8eb).
⚠️ Report is 1 commit behind head on main.
✅ All tests successful. No failed tests found.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2746   +/-   ##
=======================================
  Coverage   70.10%   70.11%           
=======================================
  Files         228      228           
  Lines       27387    27396    +9     
  Branches     2785     2785           
=======================================
+ Hits        19201    19210    +9     
  Misses       7227     7227           
  Partials      959      959           


@justinchuby justinchuby added the module: torchlib Related to the torch/aten function lib in development label Dec 29, 2025
@justinchuby justinchuby requested a review from xadupre December 29, 2025 18:53
@justinchuby (Collaborator) left a comment
Thanks. @xadupre or @gramalingam for another review

Copilot AI (Contributor) left a comment

Pull request overview

This PR aims to optimize the aten_bilinear function by replacing the Einstein summation (einsum) implementation with matrix multiplication (matmul) operations for improved performance.

Key changes:

  • Replaced single einsum operation with a sequence of transpose, reshape, matmul, and squeeze operations
  • Added explicit shape extraction for batch dimensions and feature dimensions
  • Modified computation flow to use matmul instead of einsum for the bilinear transformation

@justinchuby (Collaborator) left a comment

Found some issues

@github-project-automation github-project-automation bot moved this from Done to In Progress in ONNX Script Review Board Dec 30, 2025
@fw7th (Contributor, Author) commented Jan 6, 2026

Hey @justinchuby! I fixed the batch-dim issue by switching to Shape(input1, start=0, end=-1) so all leading dims are preserved (the op requires identical batch shapes anyway).

Tests still intermittently fail due to the existing RNG seed issue (same as the upsample_bilinear2d tests), which seems unrelated to this change.

To better guard against regressions here, would you like me to add a test case for bilinear with batch_dim > 1? I can do that in a follow-up PR if that’s preferable.
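
The Shape(input1, start=0, end=-1) idea — treating every dim except the last as a batch dim — can be illustrated with a hypothetical NumPy sketch that flattens the leading dims, runs the 2-D matmul computation, and reshapes back. The function name `bilinear_nd` and the flatten-then-reshape strategy are assumptions for illustration, not the actual ONNX Script graph:

```python
import numpy as np

def bilinear_nd(x1, x2, weight, bias=None):
    """Bilinear transform with arbitrary leading (batch) dims, mirroring the
    idea behind Shape(input1, start=0, end=-1): all dims except the last are
    batch dims and must match between the two inputs. Illustrative sketch."""
    lead = x1.shape[:-1]                 # all leading dims of input1
    assert x2.shape[:-1] == lead         # the op requires identical batch shapes
    f1 = x1.reshape(-1, x1.shape[-1])    # (N, I1), N = prod(lead)
    f2 = x2.reshape(-1, x2.shape[-1])    # (N, I2)
    z = np.matmul(f1, weight)                          # (O, N, I2)
    out = np.matmul(z[:, :, None, :], f2[:, :, None])  # (O, N, 1, 1)
    out = out.squeeze(-1).squeeze(-1).T                # (N, O)
    if bias is not None:
        out = out + bias
    return out.reshape(*lead, weight.shape[0])         # restore leading dims
```

Because all leading dims are preserved wholesale, a case like batch shape (2, 3) works the same as a flat batch of 6.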

@fw7th fw7th requested a review from justinchuby January 6, 2026 07:23
@justinchuby (Collaborator) commented Jan 6, 2026

> To better guard against regressions here, would you like me to add a test case for bilinear with batch_dim > 1? I can do that in a follow-up PR if that’s preferable.

That would be ideal, thanks! Yes a follow up PR is preferable.

@justinchuby justinchuby self-assigned this Jan 6, 2026
@justinchuby (Collaborator) left a comment
Thank you!

@github-project-automation github-project-automation bot moved this from In Progress to Done in ONNX Script Review Board Jan 6, 2026
@fw7th (Contributor, Author) commented Jan 6, 2026

> Thank you!

same as well chief

@justinchuby justinchuby merged commit 6e91205 into microsoft:main Jan 6, 2026
30 of 32 checks passed

Labels

module: torchlib Related to the torch/aten function lib in development

Successfully merging this pull request may close these issues.

Implement aten_bilinear